Assessing quality of volunteer crowdsourcing contributions: lessons from the Cropland Capture game
Authors
Abstract
Volunteered Geographical Information (VGI) is the assembly of spatial information based on public input. While VGI has proliferated in recent years, assessing the quality of volunteer-contributed data has proven challenging, leading some to question the efficiency of such programs. In this paper, we compare several quality metrics for individual volunteers’ contributions. The data was the product of the ‘Cropland Capture’ game, in which several thousand volunteers assessed 165,000 images for the presence of cropland over the course of six months. We compared agreement between volunteer ratings and an image’s majority classification with volunteer self-agreement on repeated images and with expert evaluations. We also examined the impact of experience and learning on performance. Volunteer self-agreement was nearly always higher than agreement with majority classifications, and much greater than agreement with expert validations, although these metrics were all positively correlated. Volunteer quality showed a broad trend toward improvement with experience, but the highest accuracies were achieved by a handful of moderately active contributors, not the most active volunteers. Our results emphasize the importance of a universal set of expert-validated tasks as a gold standard for evaluating VGI quality.
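The per-volunteer quality metrics compared in the abstract (agreement with an image's majority classification, and self-agreement on repeated images) can be illustrated with a short sketch. The data layout, field names, and toy ratings below are assumptions for illustration, not the authors' actual pipeline.

```python
# Hypothetical sketch of two per-volunteer quality metrics discussed in the abstract:
# agreement with the image's majority label and self-agreement on repeated images.
from collections import Counter, defaultdict

# Each rating: (volunteer_id, image_id, label), where label is True/False for "cropland".
ratings = [
    ("v1", "img1", True), ("v2", "img1", True), ("v3", "img1", False),
    ("v1", "img2", False), ("v2", "img2", False),
    ("v1", "img1", True),  # v1 rated img1 a second time (a repeated image)
]

def majority_labels(ratings):
    """Majority classification per image (ties resolved arbitrarily)."""
    votes = defaultdict(list)
    for _, image, label in ratings:
        votes[image].append(label)
    return {img: Counter(labels).most_common(1)[0][0] for img, labels in votes.items()}

def majority_agreement(ratings, majorities, volunteer):
    """Share of a volunteer's ratings that match the image's majority label."""
    own = [(img, lab) for v, img, lab in ratings if v == volunteer]
    return sum(lab == majorities[img] for img, lab in own) / len(own)

def self_agreement(ratings, volunteer):
    """Share of repeated images on which the volunteer answered consistently."""
    per_image = defaultdict(list)
    for v, img, lab in ratings:
        if v == volunteer:
            per_image[img].append(lab)
    repeats = [labs for labs in per_image.values() if len(labs) > 1]
    if not repeats:
        return None  # this volunteer never saw a repeated image
    return sum(len(set(labs)) == 1 for labs in repeats) / len(repeats)

majorities = majority_labels(ratings)
print(majority_agreement(ratings, majorities, "v1"))  # 1.0 on this toy data
print(self_agreement(ratings, "v1"))                  # 1.0 on this toy data
```

On real data these two scores can diverge, which is the comparison the paper draws between self-consistency and agreement with the crowd or with experts.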
Similar resources
Cropland Capture - A Game for Improving Global Cropland Maps
Current satellite-derived global land-cover products, which are crucial for many modelling and monitoring applications, show large disagreements when compared with one another. To help improve global land cover (in particular the cropland class), we developed a game called Cropland Capture. This is a simple cross-platform game for collecting image classifications that will be used to develop a...
A global reference database of crowdsourced cropland data collected using the Geo-Wiki platform
A global reference data set on cropland was collected through a crowdsourcing campaign using the Geo-Wiki crowdsourcing tool. The campaign lasted three weeks, with over 80 participants from around the world reviewing almost 36,000 sample units, focussing on cropland identification. For quality assessment purposes, two additional data sets are provided. The first is a control set of 1,793 sample...
Vote Aggregation Techniques in the Geo-Wiki Crowdsourcing Game: A Case Study
The Cropland Capture game (CCG) aims to map cultivated lands using around 170,000 satellite images. The contribution of the paper is threefold: (a) we improve the quality of the CCG’s dataset, (b) we benchmark state-of-the-art algorithms for vote aggregation in a crowdsourcing setting and compare the results with machine learning algorithms, (c) we propose an explanation for...
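Vote aggregation of the kind benchmarked in that case study typically starts from simple baselines. The sketch below shows plain majority voting and a reliability-weighted variant; the toy votes and per-volunteer weights are illustrative assumptions, not the algorithms evaluated in the cited paper.

```python
# Minimal sketch of two vote-aggregation baselines: unweighted majority voting
# and reliability-weighted voting. Weights and votes are made-up toy values.
def aggregate(votes, weights=None):
    """Return an aggregated True/False label per image; unweighted call = majority vote."""
    result = {}
    for image, pairs in votes.items():
        score = 0.0
        for volunteer, label in pairs:
            w = weights.get(volunteer, 1.0) if weights else 1.0
            score += w if label else -w
        result[image] = score >= 0  # ties resolved in favour of True
    return result

votes = {  # image_id -> list of (volunteer_id, label) pairs
    "img1": [("v1", True), ("v2", True), ("v3", False)],
    "img2": [("v1", False), ("v3", True)],
}
reliability = {"v1": 0.9, "v2": 0.6, "v3": 0.5}  # assumed per-volunteer accuracy estimates

print(aggregate(votes))               # majority vote
print(aggregate(votes, reliability))  # reliability-weighted vote
```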
"Dr. Detective": combining gamification techniques and crowdsourcing to create a gold standard in medical text
This paper proposes a design for a gamified crowdsourcing workflow to extract annotations from medical text. Developed in the context of a general crowdsourcing platform, Dr. Detective is a game with a purpose that engages medical experts in solving annotation tasks on medical case reports, tailored to capture disagreement between annotators. It incorporates incentives such as learning feature...
Lessons from Volunteer Management in the West Country Earthquake (Azgaleh - Sar Pul Zahab)
This article has no abstract.
Journal: Int. J. Digital Earth
Volume 9, Issue -
Pages -
Publication year: 2016